Energy Relaxation for Hopfield Network with the New Learning Rule
Abstract
In this paper, the time for energy relaxation in the Little-Hopfield neural network using the new learning rule is shown to be shorter than the relaxation time obtained with Hebbian learning. This is to be expected given the characteristics of the activation function, and computer simulations confirm that it is indeed so. The simulations also show that the new learning rule has a higher storage capacity than the Hebb rule. In section 3, logic programming on a neural network, focused on the Hopfield model, is described. In section 4, the Hebb rule and the new learning rule are discussed. This is followed by section 5, where fitness landscapes are discussed. In section 6, the implementation of both learning rules is described. Section 7 contains a discussion of the results obtained from computer simulations. Finally, concluding remarks regarding this work occupy the last section.

2. THE LITTLE-HOPFIELD MODEL

The Hopfield model [7,8] is a standard model for associative memory. The Little dynamics is asynchronous, with each neuron updating its state deterministically. The system consists of N formal neurons, each of which is described by an Ising variable [9] $S_i(t)$, $i = 1, 2, \ldots, N$. Neurons are thus bipolar, $S_i \in \{-1, 1\}$, obeying the dynamics $S_i \to \mathrm{sgn}(h_i)$, where the field is

$$h_i = \sum_j J_{ij}^{(2)} S_j + J_i^{(1)} \qquad (1)$$

with $i$ and $j$ running over all $N$ neurons, $J_{ij}^{(2)}$ is the synaptic strength from neuron $j$ to neuron $i$, and $-J_i^{(1)}$ is the threshold of neuron $i$. Restricting the connections to be symmetric and zero-diagonal, $J_{ij}^{(2)} = J_{ji}^{(2)}$ and $J_{ii}^{(2)} = 0$, allows one to write a Lyapunov or energy function,

$$E = -\frac{1}{2} \sum_i \sum_j J_{ij}^{(2)} S_i S_j - \sum_i J_i^{(1)} S_i \qquad (2)$$

which decreases monotonically with the dynamics. The two-connection model can be generalized to include higher-order connections. This modifies the "field" to be

$$h_i = \cdots + \sum_j \sum_k J_{ijk}^{(3)} S_j S_k + \sum_j J_{ij}^{(2)} S_j + J_i^{(1)} \qquad (3)$$

where "$\cdots$" denotes still higher orders, and an energy function can be written as follows:

$$E = \cdots - \frac{1}{3} \sum_i \sum_j \sum_k J_{ijk}^{(3)} S_i S_j S_k - \frac{1}{2} \sum_i \sum_j J_{ij}^{(2)} S_i S_j - \sum_i J_i^{(1)} S_i$$
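The asynchronous dynamics $S_i \to \mathrm{sgn}(h_i)$ and the two-connection energy function can be illustrated with a minimal NumPy sketch. This is not the paper's code: it assumes Hebbian couplings $J_{ij}^{(2)} = \frac{1}{N}\sum_\mu \xi_i^\mu \xi_j^\mu$ and zero thresholds $J_i^{(1)} = 0$, and the network size, pattern count, and function names are illustrative choices.

```python
import numpy as np

# Minimal sketch (not the paper's code): asynchronous Little-Hopfield
# relaxation with Hebbian couplings and zero thresholds.
rng = np.random.default_rng(0)
N, P = 50, 3                                    # illustrative sizes
patterns = rng.choice([-1, 1], size=(P, N))     # stored bipolar patterns xi^mu
J = (patterns.T @ patterns).astype(float) / N   # Hebb rule: J_ij = (1/N) sum_mu xi_i xi_j
np.fill_diagonal(J, 0.0)                        # symmetric, zero-diagonal connections

def energy(s):
    # E = -(1/2) sum_ij J_ij s_i s_j  (thresholds taken as zero)
    return -0.5 * s @ J @ s

def relax(s, sweeps=20):
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):            # asynchronous, one neuron at a time
            s[i] = 1 if J[i] @ s >= 0 else -1   # S_i -> sgn(h_i)
    return s

s0 = rng.choice([-1, 1], size=N)
s_final = relax(s0)
print(energy(s0), energy(s_final))              # energy never increases
```

Because J is symmetric with zero diagonal, each single-neuron flip can only lower or preserve E, so E serves as a Lyapunov function and the relaxation terminates in a local minimum, typically at or near one of the stored patterns.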
Similar papers
Upgrading Logic Programming in Hopfield Network
The convergence property for doing logic programming in a Hopfield network can be accelerated by using a new relaxation method. This paper shows that the performance of the Hopfield network can be improved by using a relaxation rate to control the energy relaxation process. The capacity and performance of these networks are tested by using computer simulations. It was proven by computer simulations ...
A Heuristic and Its Mathematical Analogue within Artificial Neural Network Adaptation Context
This paper presents an observation on adaptation of Hopfield neural network dynamics configured as a relaxation-based search algorithm for static optimization. More specifically, two adaptation rules, one heuristically formulated and the second being gradient descent based, for updating constraint weighting coefficients of Hopfield neural network dynamics are discussed. Application of two adapt...
Neuro-Optimizer: A New Artificial Intelligent Optimization Tool and Its Application for Robot Optimal Controller Design
The main objective of this paper is to introduce a new intelligent optimization technique that uses a prediction-correction strategy supported by a recurrent neural network for finding a near-optimal solution of a given objective function. Recently there have been attempts for using artificial neural networks (ANNs) in optimization problems, and some types of ANNs such as Hopfield network and Boltzmann ...
Augmented Lagrangian Hopfield Network for Combined Heat and Power Economic Dispatch
This paper proposes an augmented Lagrangian Hopfield network (ALHN) for combined heat and power economic dispatch (CHPED) problem. The ALHN is the continuous Hopfield neural network based on augmented Lagrangian relaxation as its energy function. In the proposed ALHN, its energy function is augmented by Hopfield terms from Hopfield neural network and penalty factors from augmented Lagrangian re...
Adiabatic quantum optimization for associative memory recall
Hopfield networks are a variant of associative memory that recall patterns stored in the couplings of an Ising model. Stored memories are conventionally accessed as fixed points in the network dynamics that correspond to energetic minima of the spin state. We show that memories stored in a Hopfield network may also be recalled by energy minimization using adiabatic quantum optimization ...